Hessian Matrices via Automatic Differentiation
Abstract
We investigate the computation of Hessian matrices via Automatic Differentiation, using a graph model and an algebraic model. The graph model reveals the inherent symmetries involved in calculating the Hessian. The algebraic model, based on Griewank and Walther’s state transformations [7], synthesizes the calculation of the Hessian as a formula. These dual points of view, graphical and algebraic, lead to a new framework for Hessian computation. This is illustrated by giving a new correctness proof for Griewank and Walther’s reverse Hessian algorithm [7, p. 157] and by developing edge pushing, a new truly reverse Hessian computation algorithm that fully exploits the Hessian’s symmetry. Computational experiments compare the performance of edge pushing on sixteen functions from the CUTE collection [1] against two algorithms available as drivers of the software ADOL-C [4, 8, 14], and the results are very promising.
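The algebraic model synthesizes the Hessian through Griewank and Walther's state transformations; the identity underneath such a formula is the classical second-order chain rule for a composition. The Python sketch below is a hedged illustration of that identity only (it is not the paper's edge pushing algorithm): assuming a toy composition h = f ∘ g with hand-coded derivatives of f and g, it assembles the Hessian of h with NumPy and checks it against a hand-derived Hessian. Every term in the assembly is symmetric, which is the symmetry a reverse Hessian method such as edge pushing sets out to exploit.

```python
import numpy as np

# Toy composition h(x) = f(g(x)) with
#   g(x) = (x0*x1, x0 + x1**2)   and   f(y) = y0**2 + y0*y1   (illustrative choices).

def g(x):
    return np.array([x[0] * x[1], x[0] + x[1] ** 2])

def g_jac(x):                       # Jacobian of g
    return np.array([[x[1], x[0]],
                     [1.0, 2.0 * x[1]]])

def g_hessians(x):                  # Hessian of each component of g
    return [np.array([[0.0, 1.0], [1.0, 0.0]]),
            np.array([[0.0, 0.0], [0.0, 2.0]])]

def f_grad(y):
    return np.array([2.0 * y[0] + y[1], y[0]])

def f_hess(y):
    return np.array([[2.0, 1.0], [1.0, 0.0]])

def hessian_of_composition(x):
    """Second-order chain rule:  H_h = J_g^T H_f J_g + sum_k (grad f)_k * H_{g_k}."""
    y = g(x)
    J = g_jac(x)
    H = J.T @ f_hess(y) @ J                       # curvature of f pulled back through g
    for wk, Hk in zip(f_grad(y), g_hessians(x)):  # first-order weights times curvature of g
        H += wk * Hk
    return H

x = np.array([1.0, 2.0])
H = hessian_of_composition(x)

# Hand-derived Hessian of h(x) = x0^2*x1^2 + x0^2*x1 + x0*x1^3, for comparison.
H_direct = np.array([[2 * x[1] ** 2 + 2 * x[1], 4 * x[0] * x[1] + 2 * x[0] + 3 * x[1] ** 2],
                     [4 * x[0] * x[1] + 2 * x[0] + 3 * x[1] ** 2, 2 * x[0] ** 2 + 6 * x[0] * x[1]]])
print(H)
print(np.allclose(H, H_direct))   # True
```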
Similar Papers
Efficient (Partial) Determination of Derivative Matrices via Automatic Differentiation
In many scientific computing applications involving nonlinear systems or methods of optimization, a sequence of Jacobian or Hessian matrices is required. Automatic differentiation (AD) technology can be used to accurately determine these matrices, and it is well known that if these matrices exhibit a sparsity pattern (for all iterates), then not only can AD take advantage of this sparsity for s...
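To make the sparsity argument above concrete: if the columns of a Jacobian are partitioned into structurally orthogonal groups (no two columns in a group have a nonzero in the same row), then one compressed directional-derivative evaluation per group recovers every nonzero entry. The sketch below is a minimal Python illustration under assumed names; the map F, the sparsity pattern, the grouping, and the finite-difference jvp (a stand-in for what would be a forward-mode AD sweep) are all illustrative choices, not taken from the cited paper.

```python
import numpy as np

# F : R^4 -> R^3 with a sparse Jacobian (each row touches few columns).
def F(x):
    return np.array([x[0] * x[1],        # row 0 depends on x0, x1
                     x[1] + x[2] ** 2,   # row 1 depends on x1, x2
                     3.0 * x[3]])        # row 2 depends on x3

# Known sparsity pattern of J (rows x columns).
pattern = np.array([[1, 1, 0, 0],
                    [0, 1, 1, 0],
                    [0, 0, 0, 1]], dtype=bool)

# Columns {0, 2, 3} never share a nonzero row, so they form one group;
# column 1 gets its own group: two sweeps instead of four.
groups = [[0, 2, 3], [1]]

def jvp(x, v, eps=1e-7):
    # Stand-in for a forward-mode AD sweep computing J(x) @ v; approximated
    # here by a forward difference only to keep the sketch dependency-free.
    return (F(x + eps * v) - F(x)) / eps

x = np.array([1.0, 2.0, 3.0, 4.0])
J = np.zeros(pattern.shape)
for group in groups:
    seed = np.zeros(4)
    seed[group] = 1.0                  # seed vector = sum of the unit vectors in the group
    compressed = jvp(x, seed)          # one column of the compressed Jacobian
    for j in group:
        rows = pattern[:, j]           # entries can be read off directly because the
        J[rows, j] = compressed[rows]  # columns in a group never overlap row-wise
print(np.round(J, 4))                  # recovers [[2, 1, 0, 0], [0, 1, 6, 0], [0, 0, 0, 3]]
```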
Review of theory and implementation of hyper-dual numbers for first and second order automatic differentiation
In this review we present hyper-dual numbers as a tool for the automatic differentiation of computer programs via operator overloading. We start with a motivational introduction to the ideas of algorithmic differentiation. Then we illuminate the concepts behind operator overloading and dual numbers. Afterwards, we present hyper-dual numbers (and vectors) as an extension of dual numbers for th...
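A minimal sketch of the hyper-dual arithmetic described above, assuming the usual four-component representation a + b·ε1 + c·ε2 + d·ε1ε2 with ε1² = ε2² = 0: seeding one input in the ε1 direction and another in the ε2 direction makes the ε1ε2 coefficient of the output the corresponding exact mixed second partial. The class and function names are illustrative, not taken from the review.

```python
import math

class HyperDual:
    """x = a + b*e1 + c*e2 + d*e1*e2, with e1**2 = e2**2 = 0 but e1*e2 != 0."""
    def __init__(self, a, b=0.0, c=0.0, d=0.0):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __add__(self, o):
        o = o if isinstance(o, HyperDual) else HyperDual(o)
        return HyperDual(self.a + o.a, self.b + o.b, self.c + o.c, self.d + o.d)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, HyperDual) else HyperDual(o)
        return HyperDual(self.a * o.a,
                         self.a * o.b + self.b * o.a,
                         self.a * o.c + self.c * o.a,
                         self.a * o.d + self.b * o.c + self.c * o.b + self.d * o.a)
    __rmul__ = __mul__

def exp(x):
    # Univariate rule: f(a) + b f'(a) e1 + c f'(a) e2 + (d f'(a) + b c f''(a)) e1 e2.
    ea = math.exp(x.a)
    return HyperDual(ea, x.b * ea, x.c * ea, x.d * ea + x.b * x.c * ea)

# Mixed second partial of f(x, y) = exp(x*y):
x = HyperDual(1.5, b=1.0)   # perturb x in the e1 direction
y = HyperDual(0.5, c=1.0)   # perturb y in the e2 direction
f = exp(x * y)
print(f.a)   # f(1.5, 0.5)  = exp(0.75)
print(f.b)   # df/dx        = y*exp(x*y)         = 0.5*exp(0.75)
print(f.c)   # df/dy        = x*exp(x*y)         = 1.5*exp(0.75)
print(f.d)   # d2f/dxdy     = (1 + x*y)*exp(x*y) = 1.75*exp(0.75)
```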
ADMAT: Automatic differentiation in MATLAB using object oriented methods
Differentiation is one of the fundamental problems in numerical mathematics. The solution of many optimization problems and other applications requires knowledge of the gradient, the Jacobian matrix, or the Hessian matrix of a given function. Automatic differentiation (AD) is an emerging and powerful technology for computing these derivatives accurately and quickly. ADMAT (Automatic Differentiation for M...
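ADMAT realizes AD through MATLAB's object-oriented operator overloading; the sketch below is not ADMAT's interface but a minimal Python analogue of the same idea: an overloaded node class records the computational graph, and a reverse sweep accumulates the gradient. The names Var and backward are assumptions made for the illustration.

```python
import math

class Var:
    """Scalar node in a computational graph recorded by operator overloading."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # tuples (parent_node, local_partial_derivative)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))
    __rmul__ = __mul__

def sin(x):
    return Var(math.sin(x.value), ((x, math.cos(x.value)),))

def backward(output):
    """Reverse sweep: accumulate d(output)/d(node) into node.grad for every node."""
    order, seen = [], set()
    def visit(node):                      # topological order by depth-first search
        if id(node) in seen:
            return
        seen.add(id(node))
        for parent, _ in node.parents:
            visit(parent)
        order.append(node)
    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, partial in node.parents:
            parent.grad += partial * node.grad

# Gradient of f(x, y) = x*y + sin(x) at (1.5, -2.0).
x, y = Var(1.5), Var(-2.0)
f = x * y + sin(x)
backward(f)
print(x.grad)   # df/dx = y + cos(x) = -2 + cos(1.5)
print(y.grad)   # df/dy = x          =  1.5
```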
Graph Coloring in Optimization Revisited (Reports in Informatics, ISSN 0333-3590)
We revisit the role of graph coloring in modeling problems that arise in efficient estimation of large sparse Jacobian and Hessian matrices using both finite difference (FD) and automatic differentiation (AD) techniques, in each case via direct methods. For Jacobian estimation using column partitioning, we propose a new coloring formulation based on a bipartite graph representation. This is com...
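For the column-partitioning problem mentioned above, a partition into structurally orthogonal column groups corresponds to a distance-1 coloring of the column intersection graph (the report's bipartite formulation gives an equivalent distance-2 view). The sketch below is a small, hedged Python illustration of the classical intersection-graph view with a toy sparsity pattern and a greedy sequential coloring; it is not the coloring formulation proposed in the report.

```python
from itertools import combinations

# Sparsity pattern of a Jacobian: rows[i] = set of columns with a nonzero in row i.
rows = [{0, 1}, {1, 2}, {3}, {2, 3}]
n_cols = 4

# Two columns "intersect" if some row has nonzeros in both; such columns must not
# share a color, otherwise their entries would overlap in a compressed evaluation.
adjacent = {j: set() for j in range(n_cols)}
for row in rows:
    for u, v in combinations(sorted(row), 2):
        adjacent[u].add(v)
        adjacent[v].add(u)

# Greedy sequential (distance-1) coloring of the column intersection graph.
color = {}
for j in range(n_cols):
    used = {color[k] for k in adjacent[j] if k in color}
    c = 0
    while c in used:
        c += 1
    color[j] = c

groups = {}
for j, c in color.items():
    groups.setdefault(c, []).append(j)
print(groups)   # e.g. {0: [0, 2], 1: [1, 3]}: two groups instead of four columns
```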
Relating MERIS FAPAR Products to Radiation Transfer Schemes Used in Climate/Numerical Weather Prediction and Carbon Models
The main goal of this study is to contribute to bridging the gap between available remote sensing products and large-scale models. We present here results from the application of an inversion method designed to assimilate various remote sensing surface flux products, e.g. albedos from Terra and FAPAR from ENVISAT, into a state-of-the-art plane-parallel (2-stream) radiation transfer scheme. This ...